perm filename COMMON.ABS[W83,JMC]1 blob sn#705093 filedate 1983-04-02 generic text, type C, neo UTF8
See common[f82,jmc]
What is common sense and what programs need it

Abstract: Common sense thinking combines certain general knowledge about
the world with the facts of a particular situation in order to decide what
to do.  Even after thirty years of research in artificial intelligence, we
still don't have any good picture of what knowledge and what reasoning
ability is involved.  While many impressive and useful artificial
intelligence programs
don't have or need much common sense ability, general intelligence
requires it, and so do many important AI programs.  Moreover, until
programs have some general common sense, a human will have to use his
common sense to determine whether a program is usable in a given
situation.

Introduction

	Most present AI programs don't have common sense knowledge in
the sense to be discussed in this paper.  Instead they use production
rules or something similar.  These rules match patterns against a
database, and when they find a match they use the action part of the
rule to decide what to do.
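The production-rule cycle just described can be sketched as follows.  This is a minimal illustration in a modern notation, not the implementation of any particular system; the facts and rules are invented for the example.

```python
# Database of facts about the current situation.
database = {("fever", "high"), ("culture", "positive")}

# Hypothetical rules: (pattern, action) pairs.  A pattern is a set of
# facts that must all be present in the database for the rule to fire.
rules = [
    ({("fever", "high"), ("culture", "positive")}, "order blood test"),
    ({("fever", "none")}, "no action"),
]

def fire(rules, database):
    """Return the action of the first rule whose pattern matches."""
    for pattern, action in rules:
        if pattern <= database:   # every fact in the pattern is in the database
            return action
    return None

print(fire(rules, database))      # -> order blood test
```

The point is that each rule goes directly from a matched situation to an action; no general knowledge independent of the rules is represented.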

We will try to identify what knowledge and what reasoning methods are
required for common sense.  We will also try to identify what practical AI
tasks require it.  This constitutes part of an agenda for AI research.



	It is now widely agreed among AI researchers that understanding
common sense knowledge and reasoning is a key problem.
Although many "ecological niches" exist for expert systems without
common sense, present programs are limited and brittle.

Here are some of the components of common sense as I see it.

1. The ability to represent declarative knowledge in a way independent
of particular goals and to use this knowledge in pursuit of whatever
goals it may be relevant to.  Both knowledge of particular situations
and general knowledge expressed by quantified sentences are required.

2. Knowledge of events occurring in time and of situations,
including partial situations and partial knowledge of situations.
The effects of actions and other events.

3. Facts about goals and their achievement.  Even when the goals are
intellectual, e.g. the solution of a mathematics problem, time,
events and actions are involved in their achievement.

4. Concurrent events.

5. Other people, their knowledge, beliefs and goals.  Other attitudes
such as hopes and fears, likes and dislikes.

6. Space and objects in space, their persistence in time, their creation
and destruction.  Extended objects, shapes.  Rigid and flexible objects.
Substances.

7. Quantities and measurement.

8. The world described by science and its connection with the
world as described by common sense.

9. Epistemology.  A theory of how knowledge is obtained.

10. Communication.  How utterances are to be understood as making assertions.
How the behavior of other people is affected by communication.

11. Knowledge of abstract entities and their relation to concrete ones.

	It seems to be a formidable task to build a computer program
with all the above abilities.  Perhaps they can be arranged in a tree
so that one can make programs with the more basic capabilities and
debug them, leaving the others for later.

dec 6

	It is worthwhile to distinguish levels of thinking.

Level 1 - Situation → action, e.g. by a production rule.

Level 2 - Direct planning. Goal ∧ situation → action, but action
has preconditions forming subgoals.

Level 3 - When direct planning loses, then there arises the goal
of forming a plan.

	Perhaps the second of these is the Prolog level.  The higher levels
may be implemented by applying a lower level method to a database
that includes meta-level information.
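Level 2 can be illustrated with a toy backward-chaining sketch, in the style of Prolog's goal reduction.  The actions, their preconditions, and their effects below are invented for the example; a real planner would also track how each step changes the situation.

```python
# Hypothetical actions: each has preconditions and a single effect.
actions = {
    "get key":      {"pre": [], "adds": "have key"},
    "open door":    {"pre": ["have key"], "adds": "door open"},
    "walk through": {"pre": ["door open"], "adds": "in room"},
}

def plan(goal, facts):
    """Level 2 planning: find an action achieving the goal; its
    preconditions become subgoals, solved recursively."""
    if goal in facts:
        return []
    for name, a in actions.items():
        if a["adds"] == goal:
            steps = []
            for sub in a["pre"]:
                steps += plan(sub, facts)
            return steps + [name]
    raise ValueError("no action achieves " + goal)

print(plan("in room", set()))   # -> ['get key', 'open door', 'walk through']
```

Level 3 would arise when this direct method loses, e.g. when no single chain of actions achieves the goal, so that forming a plan itself becomes a goal.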

	1. In any situation certain objects are present and have
certain properties and certain relations to each other.

	2. A person or robot has certain opportunities to observe,
and there are relations between the properties of the objects present
and what can be observed.

	3. Events occur and situations change.  New situations arise
that are related to the old situations and the events that occur.

	4. People and robots have a choice of actions, and the
different actions have different consequences.

	5. What people do depends on their goals and their beliefs.

	6. Sometimes we can consider that a new situation is
uniquely determined by the occurrence of an action or other event.
More generally, it is necessary to take into account the effects
of concurrent processes.
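The simple case of point 6, where an event uniquely determines the new situation, is the situation calculus picture.  A minimal sketch, with situations as sets of facts and invented effects for illustration:

```python
def result(action, situation):
    """New situation after an event, for the simple case where the
    event uniquely determines it.  Effects are invented for the example:
    each action has facts it adds and facts it deletes."""
    effects = {
        "paint block red": ({"block is red"}, {"block is blue"}),
    }
    adds, deletes = effects[action]
    # Facts not affected by the event persist into the new situation.
    return (situation - deletes) | adds

s0 = frozenset({"block is blue", "block on table"})
s1 = result("paint block red", s0)
print(sorted(s1))   # -> ['block is red', 'block on table']
```

Concurrent processes break this picture: the new situation then depends on all the processes running at once, not on a single event.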

	Many computer programs developed in artificial intelligence
research have impressive capabilities, but none of them has all of
certain common sense capabilities possessed by any normal human.
We are interested in three kinds of capability.

	1. The ability to represent certain kinds of facts in their
memories.  Notice that the ability to represent facts is a prerequisite
to learning them from experience or teachers or to using them.  These
facts include the following kinds:

	a. Facts about kinds of objects and other entities.
Examples: 
	The book is on the table.
	A book has pages and writing in it.
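The two example facts can be represented declaratively, independently of any particular goal.  The encoding below is my own, chosen for illustration; the point is that the same stored fact serves whichever goal it happens to be relevant to.

```python
# A particular fact about a situation: the book is on the table.
facts = [("on", "book1", "table1")]

# A general fact about the kind "book", schematically:
#   forall x. book(x) -> has(x, pages) and has(x, writing)

# The same particular fact serves different goals:
def where_is(thing, facts):
    """Goal: find the thing."""
    return [place for (rel, x, place) in facts if rel == "on" and x == thing]

def whats_on(place, facts):
    """Goal: clear the table."""
    return [x for (rel, x, p) in facts if rel == "on" and p == place]

print(where_is("book1", facts))    # -> ['table1']
print(whats_on("table1", facts))   # -> ['book1']
```

A production rule, by contrast, would bind the fact to one particular use decided in advance.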

MYCIN

	Shortliffe 7-6979
Guest account
directory = consult
password = expert

	The experimental MYCIN expert system for recommending
treatment of bacterial
infections of the blood is an example of an AI program without common
sense that is nevertheless useful.  Of course, to say that a person
is without common sense is a denunciation, and maybe someday it will
also be a denunciation of a program, but for now we mean it in the
technical sense.

	MYCIN conducts a dialog with a physician about
a patient who is suspected of having a bacterial infection.  MYCIN
asks about symptoms and tests and recommends further
tests and treatments.

	While it doesn't use first order logic, it still seems reasonable
to speak of its ontology, the kinds of entities it discusses with
the user and computes about internally.  These include diseases,
symptoms, tests, results of tests, antibiotics, treatments and
probabilities.  It includes the name, age and sex of the patient
but it uses these facts in very limited ways.
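That ontology might be sketched as record types as follows.  This is a guess at the kinds of entities involved, written in a modern notation for concreteness; MYCIN itself is rule-based, not organized around typed records, and the field names are invented.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    age: int
    sex: str          # used by MYCIN only in very limited ways

@dataclass
class TestResult:
    test: str
    organism: str
    certainty: float  # MYCIN uses certainty factors, not true probabilities

p = Patient("J. Doe", 54, "male")
r = TestResult("blood culture", "e. coli", 0.7)
print(p.sex, r.organism)
```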

	MYCIN is a "production system": its medical knowledge is
expressed as pattern-action rules of the kind described in the
introduction, applied by a general rule interpreter.

	The March 1983 MYCIN turns out to have the following bug.  Even
if the patient has been identified as male, it will accept "amniotic
fluid" as a suspected locus of infection.  This bug is irrelevant
to the applications, since the user presumably knows better.
Changing MYCIN to eliminate the bug is not very difficult
for someone who understands the working of the program, since it
only involves modifying xx productions and adding one production.
However, the bug cannot be eliminated without reading at least
part of the program.  A human or a program with full use of
logic can accept such correction by someone understanding only
an external language.  Namely, the program can be told that
only pregnant females have amniotic fluid.
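What accepting such a correction declaratively might look like can be sketched as follows.  This is not MYCIN; the functions and the representation of the patient are invented.  The point is that the general fact is stated once, in something like an external language, and then applies to any proposed locus, with no reading or editing of the program's rules.

```python
constraints = []

def tell(constraint):
    """Accept a new general fact, without modifying any production."""
    constraints.append(constraint)

def acceptable(locus, patient):
    """Check a proposed locus of infection against all known facts."""
    return all(c(locus, patient) for c in constraints)

# "Only pregnant females have amniotic fluid."
tell(lambda locus, patient:
     locus != "amniotic fluid"
     or (patient["sex"] == "female" and patient["pregnant"]))

male = {"sex": "male", "pregnant": False}
print(acceptable("amniotic fluid", male))   # -> False: the bug is excluded
print(acceptable("blood", male))            # -> True
```

Eliminating the same bug in a production system instead requires locating and rewriting the particular productions that mention amniotic fluid.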

To what extent does MYCIN's lack of common sense interfere with its
usefulness?  Is it perhaps even as weak as John Ryder's go program,
which looks ok as long as you play reasonably but which can be
led into absurdity?

Get this information from doctors who have worked with MYCIN, starting
with Bob Blum.

1. Does MYCIN need prognostic ability to be useful?

2. Does it need to be able to detect absurdities such as pregnant
2-year-old males?  Probably not.